Really nice job @noppefoxwolf!
It looks like Apple fixed the bug in Dictionary(uniqueKeysWithValues:) at some point during this beta cycle. If you try now, the sample code should work again!
Updated to add: the sample code is working, just unbelievably slowly.
It turns out the culprit is the use of a [Substring: Int] dictionary in loadVocabulary. For some reason, constructing a dictionary with Substring keys takes many orders of magnitude longer than using String keys.
Switching the implementation to [String: Int] solves the problem. When I have time, I'll file a bug for slow performance with Dictionaries using Substrings as keys.
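For reference, here's a sketch of the fix. The shape of loadVocabulary (a newline-separated vocabulary file mapped to word indices) is my assumption, not the original code:

import Foundation

// Hypothetical loadVocabulary: builds a word -> index map from a
// newline-separated vocabulary file.
func loadVocabulary(from url: URL) throws -> [String: Int] {
    let text = try String(contentsOf: url, encoding: .utf8)
    // split(separator:) yields Substrings; converting each one to a
    // String before building the dictionary avoids the pathologically
    // slow [Substring: Int] path.
    let words = text.split(separator: "\n").map(String.init)
    return Dictionary(uniqueKeysWithValues: zip(words, 0...))
}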
I'm seeing a similar problem. The app just says "Searching" but never returns an answer. There's a log message printed that could be a clue:
2022-07-03 08:42:45.859844-0500 FindingAnswers[32757:1495990] [default] failed to map shmem for window 0
It appears that CIRAWFilter (initialized with init(imageURL:)) doesn't respect its scaleFactor property at all: it always returns the full-size image. I'm going to file a bug today!
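Here's a minimal reproduction sketch; the file path is a placeholder for any RAW file on disk:

import CoreImage

let rawURL = URL(fileURLWithPath: "/path/to/photo.dng")
if let filter = CIRAWFilter(imageURL: rawURL) {
    filter.scaleFactor = 0.25  // request a quarter-size render
    if let output = filter.outputImage {
        // Expected: an extent scaled to 25%; observed: the full-size extent.
        print("Output extent: \(output.extent)")
    }
}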
Hmm, that was a good idea, but sadly it didn't help. I'm still seeing a crash in the .onChange(of: ev) block when setting the exposure. The Objective-C runtime crashes every time I try to access any of CIRAWFilter's properties or methods after the first use.
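For context, this is roughly the pattern that crashes; ev and the .onChange block follow my code, but the surrounding view and names are otherwise illustrative:

import SwiftUI
import CoreImage

struct ExposureView: View {
    @State private var ev: Float = 0
    let rawFilter: CIRAWFilter

    var body: some View {
        Slider(value: $ev, in: -2...2)
            .onChange(of: ev) { newValue in
                // The first access works; any subsequent access to a
                // CIRAWFilter property or method crashes in the
                // Objective-C runtime.
                rawFilter.exposure = newValue
            }
    }
}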
I sort of couldn't believe this was an actual issue, but it seems to be impossible to create a pure SwiftUI view that pans and zooms at the same time!
This is one of the core multitouch behaviors Apple has supported in Photos, Maps, and Safari for fourteen years. Every Apple app that supports zooming an image also supports simultaneous panning.
I filed this serious UI bug as FB9488452 with the following sample code:
import SwiftUI

struct ContentView: View {
    @GestureState var zoom = CGFloat(1.0)
    @GestureState var pan = CGSize.zero

    var body: some View {
        VStack {
            Image(systemName: "person")
                .font(.largeTitle)
                .padding()
                .foregroundColor(.black)
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color.white)
        .scaleEffect(zoom)
        .offset(pan)
        .animation(.spring())
        .gesture(
            // The two gestures are composed with simultaneously(with:),
            // so both should be recognized at once.
            MagnificationGesture()
                .updating($zoom) { value, state, transaction in
                    state = value
                }
                .simultaneously(with: DragGesture())
                .updating($pan) { value, state, transaction in
                    // value.second is the DragGesture's value, if active.
                    state = value.second?.translation ?? .zero
                    print("Translation: \(value.second?.translation ?? .zero)")
                }
        )
    }
}
The above code composes a gesture that should allow simultaneous dragging and zooming, but only one of the gestures ever succeeds. It's as if they had been composed with exclusively(before:) rather than simultaneously(with:). This is clearly a bug: these gestures should absolutely be able to compose.
I'm also facing this issue. My iOS app does HDR-aware image processing, and I'm able to see HDR/EDR values from my CIFilters when applied to HDR video within SwiftUI's VideoPlayer.
However, there doesn't seem to be any way to display HDR photos with EDR on the iPhone, the way they appear in Photos.app. It's probably possible to produce a one-frame HDR video from the input photo, but I'm trying to avoid that hack.
Filed as FB9008409